A neural network based on the generalized FB function for nonlinear convex programs with second-order cone constraints

Authors

  • Xinhe Miao
  • Jein-Shan Chen
  • Chun-Hsu Ko
Abstract

This paper proposes a neural network approach to efficiently solve nonlinear convex programs with second-order cone constraints. The neural network model is built from the generalized Fischer–Burmeister function associated with the second-order cone. We study the existence and convergence of the trajectory of the considered neural network, and we establish its stability properties. Illustrative examples further demonstrate the effectiveness of the proposed neural network. Numerical performance under perturbation of the parameter and numerical comparisons with other neural network models are also provided. Overall, our model performs better than the two comparative methods. © 2016 Elsevier B.V. All rights reserved.
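The generalized Fischer–Burmeister function over the second-order cone underlies the model. As an illustrative sketch only (not the authors' implementation, and restricted to the classical p = 2 case of the generalized family), the FB function φ(x, y) = (x ∘ x + y ∘ y)^{1/2} − (x + y) can be evaluated via the spectral decomposition of the Jordan algebra associated with the cone K^n = {z = (z₁, z₂) : z₁ ≥ ‖z₂‖}; the function names below are hypothetical:

```python
import numpy as np

def soc_spectral(z):
    """Spectral decomposition of z = (z1, z2) w.r.t. the second-order cone:
    z = lam1*u1 + lam2*u2 with lam_i = z1 -/+ ||z2||."""
    z1, z2 = z[0], z[1:]
    nz2 = np.linalg.norm(z2)
    if nz2 > 0:
        w = z2 / nz2
    else:
        # when z2 = 0 any unit vector works; pick the first basis vector
        w = np.zeros_like(z2)
        w[0] = 1.0
    lam1, lam2 = z1 - nz2, z1 + nz2
    u1 = 0.5 * np.concatenate(([1.0], -w))
    u2 = 0.5 * np.concatenate(([1.0], w))
    return lam1, lam2, u1, u2

def soc_sqrt(z):
    """Square root in the Jordan algebra (z must lie in the cone)."""
    lam1, lam2, u1, u2 = soc_spectral(z)
    return np.sqrt(lam1) * u1 + np.sqrt(lam2) * u2

def jordan_product(x, y):
    """Jordan product: x o y = (<x, y>, x1*y2 + y1*x2)."""
    return np.concatenate(([x @ y], x[0] * y[1:] + y[0] * x[1:]))

def fb_soc(x, y):
    """Fischer-Burmeister function for the second-order cone (p = 2):
    phi(x, y) = (x o x + y o y)^{1/2} - (x + y).
    phi(x, y) = 0 iff x, y in K, <x, y> = 0."""
    z = jordan_product(x, x) + jordan_product(y, y)
    return soc_sqrt(z) - (x + y)
```

For a complementary pair, e.g. x = (2, 1, 0) ∈ K³ and y = 0, the function vanishes, which is the property the neural network exploits to recast the KKT complementarity conditions as an equation.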

Similar resources

A smoothed NR neural network for solving nonlinear convex programs with second-order cone constraints

Abstract. This paper proposes a neural network approach for efficiently solving general nonlinear convex programs with second-order cone constraints. The proposed neural network model was developed based on a smoothed natural residual merit function involving an unconstrained minimization reformulation of the complementarity problem. We study the existence and convergence of the trajectory of t...


An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems

Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...


An efficient modified neural network for solving nonlinear programming problems with hybrid constraints

This paper presents the optimization techniques for solving convex programming problems with hybrid constraints. According to the saddle point theorem, optimization theory, convex analysis theory, Lyapunov stability theory and the LaSalle invariance principle, a neural network model is constructed. The equilibrium point of the proposed model is proved to be equivalent to the optima...


Perspective reformulations of mixed integer nonlinear programs with indicator variables

We study mixed integer nonlinear programs (MINLP)s that are driven by a collection of indicator variables where each indicator variable controls a subset of the decision variables. An indicator variable, when it is “turned off”, forces some of the decision variables to assume fixed values, and, when it is “turned on”, forces them to belong to a convex set. Many practical MINLPs contain integer ...


Perspective Relaxation of MINLPs with Indicator Variables

We study mixed integer nonlinear programs (MINLP) that are driven by a collection of indicator variables where each indicator variable controls a subset of the decision variables. An indicator variable, when it is “turned off”, forces some of the decision variables to assume a fixed value, and, when it is “turned on”, forces them to belong to a convex set. Most of the integer variables in known...



Journal:
  • Neurocomputing

Volume 203  Issue 

Pages  -

Publication date 2016